
    Empathy for the devil : the poetics of identification in psychopath fiction

    As Philip L. Simpson notes, humankind has an ‘ongoing...fascination with tales of gruesome murders and evil villains’ (15). Popular culture abounds with depictions of the mad and the bad, and perhaps no single disorder holds as much morbid appeal as psychopathy, the baffling condition which combines what Hervey M. Cleckley terms a ‘mask of sanity’ with a seeming lack of the qualities usually deemed to constitute humanity. My thesis focuses on how authors have sought to explain, interpret and understand the psychopathic individual, and explores how literary techniques have manipulated readers’ responses to the moral questions posed by psychopathic characters. Between the mid-nineteenth century and the present day, authors have increasingly used empathetic narrative techniques to encourage readers to identify with and accept the villains whose stories they so voraciously consume. I track the transitions in narrative style, structure and form which take us from depictions of the psychopath as fiendish ‘other’, for example Rigaud in Charles Dickens’s Little Dorrit, to modern portrayals of the psychopathic murderer as hero, as seen in Jeff Lindsay’s series of Dexter novels. I consider what the reader gains from reading such material and how we as readers negotiate the paradox of empathising with characters who are themselves incapable of empathy. I also explore whether cultural fascination with the psychopath is based on a desire to understand the workings of the psychopathic mind, a perverse delight in our fear of the aberrant ‘other’, or whether it reveals something altogether darker and more disturbing about ourselves.

    Salford postgraduate annual research conference (SPARC) 2012 proceedings

    These proceedings bring together a selection of papers from the 2012 Salford Postgraduate Annual Research Conference (SPARC). They reflect the breadth and diversity of research interests showcased at the conference, at which over 130 researchers from Salford, the North West and other UK universities presented their work. Twenty-one papers are collated here, spanning the humanities, arts, social sciences, health, engineering, environment and life sciences, built environment and business.

    Should patients with abnormal liver function tests in primary care be tested for chronic viral hepatitis: cost minimisation analysis based on a comprehensively tested cohort

    Background: Liver function tests (LFTs) are ordered in large numbers in primary care, and the Birmingham and Lambeth Liver Evaluation Testing Strategies (BALLETS) study was set up to assess their usefulness in patients with no pre-existing or self-evident liver disease. All patients were tested for chronic viral hepatitis, thereby providing an opportunity to compare various strategies for detection of this serious treatable disease.
    Methods: This study uses data from the BALLETS cohort to compare various testing strategies for viral hepatitis in patients who had received an abnormal LFT result. The aim was to inform a strategy for identification of patients with chronic viral hepatitis. We used a cost-minimisation analysis to define a base case and then calculated the incremental cost per case detected to inform a strategy that could guide testing for chronic viral hepatitis.
    Results: Of the 1,236 study patients with an abnormal LFT, 13 had chronic viral hepatitis (nine hepatitis B and four hepatitis C). The strategy advocated by current guidelines (repeating the LFT with a view to testing for specific disease if it remained abnormal) was less efficient (more expensive per case detected) than a simple policy of testing all patients for viral hepatitis without repeating LFTs. A more selective strategy of testing only patients born in countries where viral hepatitis is prevalent provided high efficiency with little loss of sensitivity. A notably high alanine aminotransferase (ALT) level (greater than twice the upper limit of normal) on the initial ALT test had high predictive value but was insensitive, missing half the cases of viral infection.
    Conclusions: Based on this analysis and on widely accepted clinical principles, a "fast and frugal" heuristic was produced to guide general practitioners in diagnosing viral hepatitis in asymptomatic patients with abnormal LFTs. It recommends testing all patients with a clear clinical indication of infection (e.g. evidence of intravenous drug use), then all patients who originated from countries where viral hepatitis is prevalent, and finally those with a notably raised ALT level (more than twice the upper limit of normal). Patients not picked up by this efficient algorithm have a risk of chronic viral hepatitis lower than that of the general population.
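
    To make the decision rule concrete, the three-step "fast and frugal" heuristic described above can be sketched as a short sequential test. The Python below is only an illustration: the function name, field names and the ALT upper limit of normal are assumptions made for this example, not the authors' published implementation.

    ALT_UPPER_LIMIT_OF_NORMAL = 40  # IU/L; laboratories differ, so treat as a placeholder

    def should_test_for_viral_hepatitis(clinical_indication: bool,
                                        born_in_prevalent_country: bool,
                                        alt_iu_per_l: float) -> bool:
        """Apply the three cues in order; the first cue that fires triggers testing."""
        if clinical_indication:            # e.g. evidence of intravenous drug use
            return True
        if born_in_prevalent_country:      # origin in a country where viral hepatitis is prevalent
            return True
        if alt_iu_per_l > 2 * ALT_UPPER_LIMIT_OF_NORMAL:  # notably raised ALT
            return True
        return False  # residual risk reported as lower than that of the general population

    # Example: asymptomatic patient, no risk factors, ALT of 90 IU/L
    print(should_test_for_viral_hepatitis(False, False, 90.0))  # True: ALT > 2x ULN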

    Applying Recent Argumentation Methods to Some Ancient Examples of Plausible Reasoning

    Plausible (eikotic) reasoning, known from ancient Greek (late Academic) skeptical philosophy, is shown to be a clear notion that can be analyzed by argumentation methods, and one that is important for argumentation studies. It is shown that there is a continuous thread running from the Sophists to the skeptical philosopher Carneades, through remarks of Locke and Bentham on the subject, to recent research in artificial intelligence. Eleven characteristics of plausible reasoning are specified by analyzing key examples of it recognized as important in ancient Greek skeptical philosophy, using an artificial intelligence model called the Carneades Argumentation System (CAS). By applying CAS to ancient examples, it is shown how plausible reasoning is especially useful for gaining a better understanding of evidential reasoning in law, and it is argued that it can also be applied to everyday argumentation. Our analysis of the snake and rope example of Carneades is also used to point out some ways CAS needs to be extended if it is to more fully model the views of this ancient philosopher on argumentation.
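
    As a rough illustration of the defeasible acceptance at work in the snake and rope example, the toy Python sketch below tentatively accepts a claim while pro arguments outweigh con arguments and retracts it when new evidence arrives. It is a deliberately simplified stand-in written for this summary, not the API of the actual Carneades Argumentation System.

    from dataclasses import dataclass

    @dataclass
    class Argument:
        claim: str      # the proposition argued about
        pro: bool       # True = supports the claim, False = attacks it
        weight: float   # plausibility weight in [0, 1]

    def accepted(claim: str, arguments: list[Argument]) -> bool:
        """Tentatively accept a claim if its pro weight exceeds its con weight."""
        pro = sum(a.weight for a in arguments if a.claim == claim and a.pro)
        con = sum(a.weight for a in arguments if a.claim == claim and not a.pro)
        return pro > con

    # The snake and rope example, loosely rendered: a coiled shape in a dim
    # room looks like a snake, until prodding shows that it does not move.
    args = [Argument("it is a snake", pro=True, weight=0.6)]       # visual appearance
    print(accepted("it is a snake", args))                         # True (tentative)
    args.append(Argument("it is a snake", pro=False, weight=0.9))  # it did not move
    print(accepted("it is a snake", args))                         # False (retracted)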

    Evaluation of fast atmospheric dispersion models in a regular street network

    The need to balance computational speed and simulation accuracy is a key challenge in designing atmospheric dispersion models that can be used in scenarios where near real-time hazard predictions are needed. This challenge is aggravated in cities, where models need to have some degree of building-awareness alongside the ability to capture the effects of dominant urban flow processes. We use a combination of high-resolution large-eddy simulation (LES) and wind-tunnel data of flow and dispersion in an idealised, equal-height urban canopy to highlight important dispersion processes and evaluate how these are reproduced by representatives of the most prevalent modelling approaches: (i) a Gaussian plume model, (ii) a Lagrangian stochastic model and (iii) street-network dispersion models. Concentration data from the LES, validated against the wind-tunnel data, were averaged over the volumes of streets in order to provide a high-fidelity reference suitable for evaluating the different models on the same footing. For the particular combination of forcing wind direction and source location studied here, the strongest deviations from the LES reference were associated with mean over-predictions of concentrations by approximately a factor of 2 and with a relative scatter of more than a factor of 4 about the mean, corresponding to cases where the mean plume centreline also deviated significantly from the LES. This was linked to low accuracy of the underlying flow models and parameters, which resulted in a misrepresentation of pollutant channelling along streets and of the uneven plume branching observed in intersections. The agreement of model predictions with the LES (which explicitly resolves the turbulent flow and dispersion processes) improved greatly when the accuracy of building-induced modifications of the driving flow field was increased. When provided with a limited set of representative velocity parameters, the comparatively simple street-network models performed as well as or better than the Lagrangian model run on full 3D wind fields. The study showed that street-network models capture the dominant building-induced dispersion processes in the canopy layer through parametrisations of horizontal advection and vertical exchange processes at scales of practical interest. At the same time, the low computational cost and short run times of the network approach make it ideally suited for emergency-response applications.
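
    For orientation, the first of the three approaches, the Gaussian plume model, reduces to a closed-form expression for concentration downwind of a point source. The Python sketch below implements the textbook formula with a ground-reflection (image-source) term; the power-law dispersion coefficients are placeholder assumptions, and this is not the specific model configuration evaluated in the study.

    import numpy as np

    def gaussian_plume(x, y, z, Q, u, H, a=0.08, b=0.06):
        """Concentration (kg m^-3) at (x, y, z) downwind of a point source.

        x: downwind distance (m); y: crosswind offset (m); z: height (m)
        Q: emission rate (kg/s); u: mean wind speed (m/s); H: source height (m)
        a, b: coefficients of the assumed sigma_y = a*x and sigma_z = b*x laws
        """
        sigma_y, sigma_z = a * x, b * x
        lateral = np.exp(-y**2 / (2 * sigma_y**2))
        # Reflection at the ground is modelled by an image source at z = -H.
        vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                    + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
        return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Example: ground-level centreline concentration 500 m downwind
    print(gaussian_plume(x=500.0, y=0.0, z=0.0, Q=1.0, u=5.0, H=20.0))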

    Deterrence in Cyberspace: An Interdisciplinary Review of the Empirical Literature

    The popularity of the deterrence perspective across multiple scientific disciplines has sparked a lively debate regarding its relevance in influencing both offenders and targets in cyberspace. Unfortunately, owing to the invisible borders between academic disciplines, most of the published literature on deterrence in cyberspace is confined within individual fields. This chapter therefore provides an interdisciplinary review of deterrence in cyberspace. It begins with a short overview of the deterrence perspective, presenting the ongoing debates concerning the relevance of deterrence pillars in influencing cybercriminals’ and cyberattackers’ operations in cyberspace. It then reviews the existing scientific evidence assessing various aspects of deterrence in the context of several disciplines: criminology, law, information systems, and political science. The chapter ends with a few policy implications and proposed directions for future interdisciplinary academic research.

    Targeted Next-Generation Sequencing Analysis of 1,000 Individuals with Intellectual Disability

    To identify genetic causes of intellectual disability (ID), we screened a cohort of 986 individuals with moderate to severe ID for variants in 565 known or candidate ID-associated genes using targeted next-generation sequencing. Likely pathogenic rare variants were found in ∼11% of the cases (113 variants in 107/986 individuals: ∼8% of the individuals had a likely pathogenic loss-of-function [LoF] variant, whereas ∼3% had a known pathogenic missense variant). Variants in SETD5, ATRX, CUL4B, MECP2, and ARID1B were the most common causes of ID. This study assessed the value of sequencing a cohort of probands to provide a molecular diagnosis of ID without the availability of DNA from both parents for de novo sequence analysis. This modeling is clinically relevant, as 28% of all UK families with dependent children are single-parent households. In conclusion, to diagnose patients with ID in the absence of parental DNA, we recommend investigation of all LoF variants in known genes that cause ID and assessment of a limited list of proven pathogenic missense variants in these genes. This will provide 11% additional diagnostic yield beyond the 10%-15% yield from array CGH alone.
    Funding: Action Medical Research (SP4640); the Birth Defect Foundation (RG45448); the Cambridge National Institute for Health Research Biomedical Research Centre (RG64219); the NIHR Rare Diseases BioResource (RBAG163); Wellcome Trust award WT091310; the Cell lines and DNA bank of Rett Syndrome, X-linked mental retardation and other genetic diseases (member of the Telethon Network of Genetic Biobanks, project no. GTB12001); and the Genetic Origins of Congenital Heart Disease Study (GO-CHD), funded by the British Heart Foundation (BHF).
    This is the final version of the article. It first appeared from Wiley via http://dx.doi.org/10.1002/humu.2290
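
    The recommended proband-only strategy amounts to a simple variant filter: report every LoF variant in a known ID gene, plus any missense variant on a curated proven-pathogenic list. The Python sketch below is hypothetical; the consequence terms, gene set and list contents are placeholders, not the study's actual pipeline.

    LOF_CONSEQUENCES = {"stop_gained", "frameshift", "splice_donor", "splice_acceptor"}
    KNOWN_ID_GENES = {"SETD5", "ATRX", "CUL4B", "MECP2", "ARID1B"}  # commonest in this cohort
    PROVEN_PATHOGENIC_MISSENSE = {("MECP2", "p.Arg306Cys")}  # placeholder entry

    def is_reportable(gene: str, consequence: str, hgvs_p: str) -> bool:
        """Return True if a proband variant should be reported as likely diagnostic."""
        if gene not in KNOWN_ID_GENES:
            return False
        if consequence in LOF_CONSEQUENCES:  # all LoF variants in known ID genes
            return True
        if consequence == "missense" and (gene, hgvs_p) in PROVEN_PATHOGENIC_MISSENSE:
            return True                      # only proven pathogenic missense variants
        return False

    print(is_reportable("ARID1B", "frameshift", "p.Ala100fs"))  # True
    print(is_reportable("ARID1B", "missense", "p.Gly200Ser"))   # False: not on the curated list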